



HowPowerfulareK-hopMessagePassingGraph NeuralNetworks

Neural Information Processing Systems

Recently, researchers extended 1-hop message passing to K-hop message passing by aggregating information from the K-hop neighbors of nodes simultaneously. However, there is no work analyzing the expressive power of K-hop message passing.
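The idea described in this abstract can be sketched as follows. This is an illustrative simplification, not the paper's actual model: one "layer" in which every node simultaneously aggregates (here, sums) the features of its exactly-k-hop neighbors for each k up to K, then concatenates the per-hop aggregates. The function name and the choice of sum-plus-concatenate are assumptions for illustration.

```python
import numpy as np

def khop_message_passing(A, X, K):
    """One layer of K-hop message passing (illustrative sketch):
    each node aggregates features from all neighbors within K hops
    simultaneously, keeping one aggregate per hop distance.
    A: (n, n) 0/1 adjacency matrix; X: (n, d) node features."""
    n = A.shape[0]
    reach = np.eye(n, dtype=bool)        # nodes reachable in <= k hops
    hops = np.full((n, n), np.inf)       # shortest-path hop distance
    np.fill_diagonal(hops, 0)
    frontier = np.eye(n, dtype=bool)
    for k in range(1, K + 1):
        frontier = (frontier @ A > 0) & ~reach  # exactly k hops away
        hops[frontier] = k
        reach |= frontier
    # Combine step: concatenate per-hop sums (one common design choice).
    per_hop = [(hops == k) @ X for k in range(1, K + 1)]
    return np.concatenate(per_hop, axis=1)
```

On a path graph 0-1-2 with one-hot features and K=2, node 0's output is its 1-hop neighbor's feature followed by its 2-hop neighbor's feature, which is exactly the "simultaneous aggregation from K-hop neighbors" the abstract refers to.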



Neural Information Processing Systems

A common approach to creating more expressive GNNs is to change the message passing function of MPNNs. If a GNN is more expressive than MPNNs by virtue of an adapted message passing function, we call this non-standard message passing. Examples of this are message passing variants that operate on subgraphs [Frasca et al., 2022, Bevilacqua
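One family of the subgraph-based variants mentioned above can be sketched as follows. This is a simplified illustration of the general idea, not any cited paper's exact method: each node is represented by running plain sum-aggregation message passing on its 1-hop ego subgraph and pooling the result, so the message passing function differs from a standard MPNN's. The function name and all design choices (number of layers, sum pooling) are assumptions.

```python
import numpy as np

def ego_subgraph_mp(A, X, num_layers=2):
    """Sketch of one 'non-standard' message passing scheme
    (subgraph GNNs): represent each node v by running sum-aggregation
    message passing on v's 1-hop ego subgraph, then sum-pooling it.
    A: (n, n) 0/1 adjacency matrix; X: (n, d) node features."""
    n = A.shape[0]
    out = np.zeros((n, X.shape[1]))
    for v in range(n):
        nodes = np.flatnonzero(A[v] + np.eye(n)[v])  # v plus neighbors
        sub_A = A[np.ix_(nodes, nodes)]              # induced subgraph
        h = X[nodes]
        for _ in range(num_layers):
            h = h + sub_A @ h        # sum over subgraph neighbors
        out[v] = h.sum(axis=0)       # pool the subgraph into one vector
    return out
```

Because the computation for node v only ever sees v's ego network, two nodes that standard 1-hop message passing cannot distinguish may still receive different representations here, which is the sense in which such variants can exceed MPNN expressiveness.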




Appendix A.1: Message Passing in SyncTREE

Neural Information Processing Systems

It should be noted that we made only a minor modification to the GraphTrans model. For NTREE, we use GAT as its basic block, with a 0.2 dropout probability between layers.
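The "GAT block with 0.2 dropout between layers" setup can be sketched as below. This is an illustrative single-head simplification, not the authors' implementation; the function names, the LeakyReLU slope, and the inverted-dropout formulation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gat_layer(A, H, W, a):
    """Simplified single-head graph attention layer: attention logits
    from the transformed endpoint features, masked softmax over
    neighbors (per A), then attention-weighted aggregation.
    A: (n, n) adjacency (include self-loops); H: (n, d_in) features;
    W: (d_in, d_out) weights; a: (2*d_out,) attention vector."""
    Z = H @ W
    d = Z.shape[1]
    # e[i, j] = LeakyReLU(a . [z_i || z_j]), computed via broadcasting
    e = (Z @ a[:d])[:, None] + (Z @ a[d:])[None, :]
    e = np.where(e > 0, e, 0.2 * e)          # LeakyReLU
    e = np.where(A > 0, e, -np.inf)          # attend only to neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ Z

def dropout(H, p=0.2, training=True):
    """Inverted dropout at the 0.2 rate mentioned above;
    identity at evaluation time."""
    if not training:
        return H
    mask = rng.random(H.shape) >= p
    return H * mask / (1 - p)
```

A stack of such blocks would interleave `gat_layer` calls with `dropout(..., p=0.2)` between them, matching the configuration described in the text.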



Diverse Message Passing for Attribute with Heterophily

Neural Information Processing Systems

By following the derivation of Theorem 4, we obtain the following theorem for Diverse Message Passing: Diverse Message Passing in Eq. (4) actually partitions the graph into connected components, based on each similarity, via Eq. (14).